# 3 trillion token pre-training

## TinyLlama 1.1B Chat v0.6
- License: Apache-2.0
- Description: TinyLlama is a 1.1 billion parameter Llama model pre-trained on 3 trillion tokens, suitable for scenarios with limited computation and memory resources.
- Tags: Large Language Model, English
- Publisher: TinyLlama · 11.60k downloads · 98 likes
## TinyLlama 1.1B Chat v0.1
- License: Apache-2.0
- Description: TinyLlama is a compact 1.1B-parameter language model pre-trained on 3 trillion tokens, suitable for applications with limited computational resources.
- Tags: Large Language Model, Transformers, English
- Publisher: TinyLlama · 6,076 downloads · 55 likes
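Both checkpoints can be loaded with the Hugging Face transformers library. The snippet below is a minimal sketch, assuming the repository id `TinyLlama/TinyLlama-1.1B-Chat-v0.6` for the v0.6 entry and that the checkpoint ships a chat template; swap in the id of the version you want to use.

```python
# Minimal sketch: run a TinyLlama chat checkpoint locally with Hugging Face transformers.
# The repository id is an assumption based on the listing above; adjust it if it differs.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "TinyLlama/TinyLlama-1.1B-Chat-v0.6"  # assumed Hugging Face repo id for the v0.6 entry

device = "cuda" if torch.cuda.is_available() else "cpu"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    # 1.1B parameters fit comfortably in fp16 on modest GPUs; fall back to fp32 on CPU.
    torch_dtype=torch.float16 if device == "cuda" else torch.float32,
).to(device)

# Format a single-turn conversation with the tokenizer's chat template
# (assumed to be present in the checkpoint) and generate a short reply.
messages = [{"role": "user", "content": "Explain in one sentence what TinyLlama is."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(device)

output_ids = model.generate(input_ids, max_new_tokens=64, do_sample=False)
reply = tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True)
print(reply)
```

The same sketch should work for the v0.1 entry by changing `repo_id`, though older checkpoints may expect a different prompt format.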